Versions:

  • 0.7.9
  • 0.7.8
  • 0.7.7
  • 0.7.6
  • 0.7.5
  • 0.7.4
  • 0.7.3
  • 0.7.2
  • 0.7.1
  • 0.7.0
  • 0.6.10
  • 0.6.9
  • 0.6.8
  • 0.6.7
  • 0.6.6
  • 0.6.5
  • 0.6.4
  • 0.6.3
  • 0.6.2
  • 0.6.1
  • 0.6.0
  • 0.5.17
  • 0.5.16
  • 0.5.15
  • 0.5.14
  • 0.5.13
  • 0.5.12
  • 0.5.11
  • 0.5.10
  • 0.5.9
  • 0.5.8
  • 0.5.7
  • 0.5.6
  • 0.5.5
  • 0.5.4
  • 0.5.3

Jan is an open-source desktop application that turns any compatible computer into a private, offline AI workstation. It runs large language models such as Mistral-7B and Llama-2 locally, while also offering optional connectivity to remote inference endpoints including OpenAI’s GPT-4 and Groq. Positioned in the machine-learning and developer-tools category, the program provides a single interface for downloading, configuring, and switching between quantized models, so users can experiment with conversational agents, code generation, document summarization, or embedding workflows without transmitting data externally.

Version 0.7.9, the thirty-sixth release listed above, refines model memory management, improves CUDA and Apple Silicon acceleration, and adds an extensible plugin ecosystem that lets researchers integrate custom prompting frameworks or export chat histories to common formats.

Typical use cases range from journalists who need confidential transcript analysis to engineers prototyping on-premises chatbots and students learning transformer architectures; enterprises likewise use Jan in air-gapped compliance environments where SaaS AI is disallowed. Because all inference happens locally by default, prompt content stays on disk, giving privacy-sensitive industries a self-contained alternative to browser-based services.

The application is available for free on get.nero.com. Downloads are provided via trusted Windows package sources (e.g. winget), always deliver the latest version, and support batch installation of multiple applications.
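Because inference stays local, client code can address Jan much like a hosted chat API. The sketch below is a minimal illustration, assuming a local OpenAI-compatible endpoint; the base URL and model name are placeholders, not confirmed defaults, so check your installation's API settings. It only constructs the request payload and never sends anything over the network:

```python
import json

# Hypothetical local endpoint for an OpenAI-compatible server run by Jan.
# Both the URL and the model name are illustrative assumptions.
JAN_BASE_URL = "http://localhost:1337/v1"


def build_chat_request(prompt: str, model: str = "mistral-7b") -> dict:
    """Build an OpenAI-style chat-completion payload for a local endpoint.

    Nothing leaves the machine here: this only assembles the JSON body
    that an HTTP client would POST to {JAN_BASE_URL}/chat/completions.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


payload = build_chat_request("Summarize this transcript in three bullets.")
body = json.dumps(payload)  # serialized request body, ready for any HTTP client
```

Keeping payload construction separate from transport makes the same code usable against a local model or, if the user opts in, a remote endpoint such as OpenAI or Groq.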

Tags: